
    Decision models for fast-fashion supply and stocking problems in internet fulfillment warehouses

    Internet technology is being widely used to transform all aspects of the modern supply chain. Specifically, accelerated product flows and widespread information sharing across the supply chain have generated new sets of decision problems. This research addresses two such problems. The first focuses on fast-fashion supply (FFS) chains in which inventory and price are managed in real time to maximize retail cycle revenue. The second is concerned with explosive storage policies in Internet Fulfillment Warehouses (IFW). Fashion products are characterized by short product life cycles and market success uncertainty. An unsuccessful product will often require multiple price discounts to clear the inventory. The first topic proposes a switching solution for fast-fashion retailers who have preordered an initial or block inventory, and plan to use channel switching as opposed to multiple discounting steps. The FFS Multi-Channel Switching (MCS) problem then is to monitor real-time demand and store inventory, such that at the optimal period the remaining store inventory is sold at clearance and the warehouse inventory is switched to the outlet channel. The objective is to maximize the total revenue. With a linear projection of the moving average demand trend, an estimation of the remaining cycle revenue at any time in the cycle is shown to be a concave function of the switching time. Using a set of conditions, the objective is further simplified into cases. The Linear Moving Average Trend (LMAT) heuristic then prescribes whether a channel switch should be made in the next period. The LMAT is compared with the optimal policy and the No-Switch and Beta-Switch rules. The LMAT performs very well, and the majority of test problems provide a solution within 0.4% of the optimal. This confirms that LMAT can readily and effectively be applied to real-time decision making in an FFS.

    An IFW is a facility built and operated exclusively for online retail, and a key differentiator is the explosive storage policy. Breaking the single-stocking-location tradition, in an IFW small batches of the same stock keeping unit (SKU) are dispersed across the warehouse. Order fulfillment time performance is then closely related to the storage location decision, that is, for every incoming bulk shipment, what is the specific storage location for each batch. Faster fulfillment is possible when SKUs are clustered such that narrow-band picklists can be efficiently generated. Stock location decisions are therefore a function of the demand arrival behavior and correlations with other SKUs. A Joint Item Correlation and Density Oriented (JICDO) Stocking Algorithm is developed and tested. JICDO is formulated to increase the probability that M pickable order items are stocked in a δ band of storage locations. It scans the current inventory dispersion to identify location bands with low SKU density and combines the storage affinity with correlated items. In small-problem testing against a MIP formulation and large-scale testing in a simulator, the JICDO performance is confirmed.
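
    As a concrete illustration of the period-by-period decision the LMAT heuristic makes, the following is a minimal sketch in Python. The function name, the revenue terms, and the simple moving-average trend projection are assumptions for illustration, not the authors' actual formulation.

    # Hedged sketch of an LMAT-style channel-switching check (illustrative only).
    # All names, parameters, and revenue terms are assumptions, not the paper's model.
    def lmat_switch_next_period(demand_history, store_inventory, warehouse_inventory,
                                store_price, clearance_price, outlet_price,
                                periods_left, window=4):
        """Return True if switching channels next period is projected to beat waiting."""
        # Linear projection of the moving-average demand trend.
        recent = demand_history[-window:]
        ma = sum(recent) / len(recent)
        trend = (recent[-1] - recent[0]) / max(len(recent) - 1, 1)

        def projected_revenue(switch_after):
            # Estimated remaining-cycle revenue if the switch happens after `switch_after` periods.
            revenue, inv = 0.0, store_inventory
            for t in range(1, switch_after + 1):
                sold = min(max(ma + trend * t, 0.0), inv)
                revenue += sold * store_price              # full-price store sales until the switch
                inv -= sold
            revenue += inv * clearance_price               # leftover store stock sold at clearance
            revenue += warehouse_inventory * outlet_price  # warehouse stock moved to the outlet channel
            return revenue

        if periods_left <= 1:
            return True
        # Compare switching at the next period against the best achievable later switch.
        best_later = max(projected_revenue(k) for k in range(2, periods_left + 1))
        return projected_revenue(1) >= best_later

    For example, lmat_switch_next_period([120, 95, 80, 60], 500, 2000, 40, 15, 22, periods_left=8) indicates whether, under these illustrative numbers, the retailer should switch at the next period rather than wait.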

    Using Omnichannel Sales Data Analytics to Decide Between Store and Distribution Center Fulfillment Options

    A brick-and-mortar retailer can fulfill online customer orders in two ways: (i) Buy Online Fulfill from Store (BOFS), where the order is picked from store inventory, and (ii) Fulfill from Distribution Center (FDC), where the order is picked from DC or warehouse inventory. The fulfillment decision is made in real time for each order, with the primary goal of maximizing the revenue value of the store inventory. Analysis of sales data from both the online and store channels is used to forecast the value of the dispersed inventory, and a prescriptive model is then developed for making the fulfillment decision.
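
    To make the real-time decision concrete, here is a minimal hedged sketch of one plausible rule of this kind in Python. The inputs (forecast store demand, margins, shipping cost) and the threshold logic are illustrative assumptions, not the prescriptive model developed in this work.

    # Hedged sketch of a store-vs-DC fulfillment choice for one online order (illustrative only).
    def choose_fulfillment(store_inventory, forecast_store_demand, store_margin,
                           online_margin, dc_shipping_cost):
        """Return 'BOFS' or 'FDC' for an incoming online order."""
        # Chance that a unit kept in the store still sells at full store margin.
        prob_store_sale = min(forecast_store_demand / max(store_inventory, 1), 1.0)
        opportunity_cost = prob_store_sale * store_margin

        bofs_value = online_margin - opportunity_cost   # ship from store, give up a possible store sale
        fdc_value = online_margin - dc_shipping_cost    # ship from the DC, pay the extra handling
        return "BOFS" if bofs_value >= fdc_value else "FDC"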

    SheetCopilot: Bringing Software Productivity to the Next Level through Large Language Models

    Computer end users have spent billions of hours completing daily tasks like tabular data processing and project timeline scheduling. Most of these tasks are repetitive and error-prone, yet most end users lack the skill to automate this burdensome work. With the advent of large language models (LLMs), directing software with natural language user requests becomes a reachable goal. In this work, we propose a SheetCopilot agent that takes a natural language task and controls a spreadsheet to fulfill the requirements. We propose a set of atomic actions as an abstraction of spreadsheet software functionalities. We further design a state machine-based task planning framework for LLMs to robustly interact with spreadsheets. We curate a representative dataset containing 221 spreadsheet control tasks and establish a fully automated evaluation pipeline for rigorously benchmarking the ability of LLMs in software control tasks. Our SheetCopilot correctly completes 44.3% of tasks for a single generation, outperforming the strong code generation baseline by a wide margin. Our project page: https://sheetcopilot.github.io/. Comment: Accepted to NeurIPS 202
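
    Below is a minimal sketch of the kind of loop the abstract describes: atomic spreadsheet actions wrapped in a state-machine-style plan/validate/execute cycle around an LLM. The action set, the `llm` callable, and the `apply_action` executor are assumptions for illustration, not the actual SheetCopilot interface.

    # Hedged sketch of a state-machine-driven spreadsheet agent (illustrative only).
    ATOMIC_ACTIONS = {"Write", "SetFormat", "CreateChart", "Filter", "Finish"}

    def run_agent(task, sheet_state, llm, apply_action, max_steps=10):
        history = []
        for _ in range(max_steps):
            # Plan state: ask the model for the next atomic action and its arguments.
            prompt = (f"Task: {task}\nSheet: {sheet_state}\nHistory: {history}\n"
                      f"Pick one action from {sorted(ATOMIC_ACTIONS)} and give its arguments.")
            proposal = llm(prompt)   # assumed to return a dict like {"action": ..., "args": ...}

            # Validate state: reject anything outside the atomic-action abstraction.
            if proposal.get("action") not in ATOMIC_ACTIONS:
                history.append(("rejected", proposal))
                continue
            if proposal["action"] == "Finish":
                break

            # Execute state: apply the action to the spreadsheet and record the feedback.
            sheet_state = apply_action(sheet_state, proposal)
            history.append(("applied", proposal))
        return history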

    The bleaching limits of IRSL signals at various stimulation temperatures and their potential inference of the pre-burial light exposure duration

    Infrared Stimulated Luminescence (IRSL) techniques are being increasingly used for dating sedimentary feldspars in the middle to late Quaternary. By employing several subsequent stimulations at increasing temperatures, a series of post-IR IRSL (pIRIR) signals with different characteristics (stability and bleachability) can be obtained for an individual sample. It has been experimentally demonstrated that higher-temperature pIRIR signals are more stable, but they tend to exhibit larger residual doses of up to a few tens of Gy, potentially causing severe age overestimation in young samples. In this study we conducted comprehensive bleaching experiments on IRSL and pIRIR signals using a loess sample from China, and demonstrated that non-bleachable components in the IR (and possibly pIRIR) signals do exist. The level of this non-bleachable signal shows a clear positive correlation with the preheat/stimulation temperature, which further supports the notion that lower-temperature pIRIR signals are advantageous for dating young samples and sediments, especially those from difficult-to-bleach environments. These results show potential for constraining the pre-burial light exposure history of sediments using multiple feldspar pIRIR signals. For the studied loess sample, we infer that, prior to its last burial, the sample had received the equivalent of >264 h of exposure to the SOL2 simulator (more than 2,000 h of natural daylight).

    What has affected the governance effect of the whole population coverage of medical insurance in China in the past decade? Lessons for other countries

    Objective: This study aimed to explore the current state of governance of full population coverage of health insurance in China and its influencing factors, to provide empirical references for countries with social backgrounds similar to China's. Methods: A cross-sectional quantitative study was conducted nationwide between 22 January 2020 and 26 January 2020, with descriptive statistics, analysis of variance, and logistic regression models in SPSS 25.0 used to analyze the effectiveness and influencing factors of the governance of full population coverage of health insurance in China. Results: The effectiveness of the governance relating to total population coverage of health insurance was rated as good by 59% of the survey respondents. According to the statistical results, the governance of the public's ability to participate in insurance (OR = 1.516), the degree of information construction in the medical insurance sector (OR = 2.345), the government's governance capacity (OR = 4.284), and the completeness of the government's governance tools (OR = 1.370) were all positively associated (p < 0.05) with the governance effect of whole-population coverage of health insurance. Conclusions: The governance of Chinese health insurance relating to total population coverage is effective. To improve the effectiveness of this governance, health insurance information construction, governance capacity, and governance tools should be the focus, so as to further improve the accurate expansion of, and increase in, health insurance coverage.
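
    For readers unfamiliar with how the reported odds ratios arise, a brief hedged sketch in Python: in a logistic regression, each OR is the exponential of the fitted coefficient. The data below are randomly generated placeholders, not the study's survey data.

    # Hedged sketch: odds ratios from a logistic regression (hypothetical data only).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    X = rng.integers(1, 6, size=(200, 4)).astype(float)           # e.g. four Likert-scale predictors
    y = (X.sum(axis=1) + rng.normal(0, 2, 200) > 12).astype(int)  # binary "governance rated good" outcome

    fit = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
    odds_ratios = np.exp(fit.params[1:])   # OR = exp(coefficient), the quantity reported above
    print(odds_ratios)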

    Capitalizing on Construction Records to Identify Relationships between Construction and Long-term Project Performance: Final Report

    0-7028. TxDOT keeps records of contracted roadway projects in several databases: materials and test records, collected for the quality control/quality assurance program, in the SiteManager (SMGR) database; construction-related information in the Design and Construction Information System (DCIS); and performance measures in the Pavement Analyst (PA) database. The primary objective of this research was to utilize the vast amount of data in these databases to identify relationships between materials and construction records and the observed long-term performance of hot-mix asphalt (HMA) pavements. Materials, construction records, and pavement surface conditions were analyzed using traditional regression analysis and new-generation data analysis tools. A few selected projects were also chosen for site inspection. The data analysis shows that binder content, binder grade, and recycled binder content influence a pavement's performance for a given service life and traffic volume. Site visits of a selected sample of projects showed that some pavements were treated with overlays that were not captured by the DCIS or PA databases. Two recommendations are made for TxDOT to continue to benefit from the findings of this research project. These are to implement the integrated database in a software platform already available to TxDOT (Tableau), and to incorporate elements related to maintenance activities into this integrated database.
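
    A minimal hedged sketch of the kind of cross-database analysis described above: join materials/test records with pavement performance measures, then regress a condition score on mix properties. The file names, column names, and model form are assumptions, not the actual SMGR, DCIS, or PA schemas.

    # Hedged sketch: join construction records with performance data, then regress (illustrative only).
    import pandas as pd
    import statsmodels.formula.api as smf

    materials = pd.read_csv("sitemanager_materials.csv")      # hypothetical: project_id, binder_content, binder_grade, recycled_binder
    performance = pd.read_csv("pavement_analyst_scores.csv")  # hypothetical: project_id, service_years, traffic, condition_score

    data = materials.merge(performance, on="project_id")
    model = smf.ols("condition_score ~ binder_content + C(binder_grade) + recycled_binder"
                    " + service_years + traffic", data=data).fit()
    print(model.summary())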